 Shannon, Claude Elwood

b. 30 April 1916 Gaylord, Michigan, USA

American mathematician, creator of information theory.

As a child, Shannon tinkered with radio kits and enjoyed solving puzzles, particularly cryptographic ones. He graduated from the University of Michigan in 1936 with a Bachelor of Science in mathematics and electrical engineering, and earned his Master's degree from the Massachusetts Institute of Technology (MIT) in 1937. His thesis, on applying Boolean algebra to switching circuits, has since been acclaimed as possibly the most significant Master's thesis of the twentieth century. Shannon earned his PhD in mathematics from MIT in 1940 with a dissertation on the mathematics of genetic transmission.

Shannon spent a year at the Institute for Advanced Study in Princeton, then in 1941 joined Bell Telephone Laboratories, where he began studying the relative efficiency of alternative transmission systems. Work on digital encryption systems during the Second World War led him to think that, just as ciphers hide information from the enemy, "encoding" information could also protect it from noise. About 1948, he decided that the amount of information was best expressed quantitatively in a two-value number system, using only the digits 0 and 1. John Tukey, a Princeton colleague, named these units "binary digits" (or, for short, "bits"). Almost all digital computers and communications systems use such on-off, or two-state, logic as their basis of operation.

Also in the 1940s, building on the work of H. Nyquist and R.V.L. Hartley, Shannon proved that there is an upper limit to the amount of information that can be transmitted through a communications channel in a unit of time, a limit that can be approached but never reached because real transmissions are subject to interference (noise). This was the beginning of information theory, which others have since used in attempts to quantify many sciences and technologies, as well as subjects in the humanities, with mixed results.
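The upper limit mentioned above is today usually stated as the Shannon-Hartley theorem: a channel of bandwidth B (in hertz) with signal-to-noise ratio S/N can carry at most B·log2(1 + S/N) bits per second. As a minimal illustration (the function name and the example channel figures are this sketch's own choices, not from the entry):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley channel capacity in bits per second.

    bandwidth_hz: channel bandwidth B in hertz.
    snr_linear:   signal-to-noise ratio S/N as a plain ratio (not dB).
    """
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative example: a 3 kHz voice channel with a 30 dB
# signal-to-noise ratio (i.e. S/N = 1000) can carry at most
# about 29,900 bits per second, no matter how clever the coding.
print(shannon_capacity(3000, 1000))
```

Note that the limit depends only on bandwidth and noise, not on the modulation scheme; real systems can approach this bound with sufficiently sophisticated coding but never exceed it.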
Before 1970, when integrated circuits were developed, Shannon's theory was not the preferred circuit-and-transmission design tool it has since become.

Shannon was also a pioneer in the field of artificial intelligence, claiming that computing machines could be used to manipulate symbols as well as perform calculations. His 1953 paper on computers and automata proposed that digital computers were capable of tasks then thought to be exclusively the province of living organisms. In 1956 he left Bell Laboratories to join the MIT faculty as Professor of Communications Science.

On the lighter side, Shannon built many devices that play games, and in particular made a scientific study of juggling.

Principal Honours and Distinctions
National Medal of Science. Institute of Electrical and Electronics Engineers Medal of Honor. Kyoto Prize.

Bibliography
His seminal paper (on what has subsequently become known as information theory) was entitled "A Mathematical Theory of Communication", first published in the Bell System Technical Journal in 1948; it is also available in a monograph (written with Warren Weaver) published by the University of Illinois Press in 1949, and in Key Papers in the Development of Information Theory, ed. David Slepian, IEEE Press, 1974, 1988. For readers who want all of Shannon's works, see N.J.A. Sloane and A.D. Wyner, 1992, The Collected Papers of Claude E. Shannon.

HO
Biographical History of Technology, ed. Lance Day and Ian McNeil, Taylor & Francis e-Library, 2005.